
    Master of Science

    We present a framework for detecting possible adverse drug reactions (ADRs) using Utah Medicaid administrative data. We examined four classes of ADRs associated with treatment of dementia by acetylcholinesterase inhibitors (AChEIs): known reactions (gastrointestinal, psychological disturbances), potential reactions (respiratory disturbance), novel reactions (hepatic, hematological disturbances), and death. Our cohort design linked drug utilization data to medical claims from Utah Medicaid recipients. We restricted the analysis to beneficiaries aged 50 years and older who had a dementia-related diagnosis. We compared patients treated with AChEIs to patients who received no antidementia medication. We attempted to remove confounding by establishing propensity-score-matched cohorts for each outcome investigated; we then evaluated effects of drug treatment by conditional multivariable Cox proportional-hazards regression. Acute and transient effects were evaluated by a crossover design using conditional logistic regression. Propensity-matched analysis of expected reactions found that AChEI treatment was associated with gastrointestinal episodes (hazard ratio [HR]: 2.02; 95% confidence interval [CI]: 1.28-3.2) but not with psychological episodes, respiratory disturbance, or death. Among the tested unexpected reactions, risk was higher for hematological episodes (HR: 2.32; 95% CI: 1.47-3.6) but not for hepatic episodes. We also noted a trend toward an increase in the odds of experiencing acute hematological events in the treated group (odds ratio [OR]: 3.0; 95% CI: 0.97-9.3). We observed an expected association between AChEIs and gastrointestinal disturbances and detected a signal of hematological adverse drug events (ADEs) after treatment with AChEIs in this pilot study. Using our analytic framework may raise awareness of potential ADEs and generate hypotheses for future investigations.
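
    As a rough illustration of the analytic framework described above, the sketch below pairs treated and untreated patients by propensity score and fits a matched-pair Cox model with scikit-learn and lifelines. It is a minimal sketch, not the study's code: the input file, the column names (treated, event, days_to_event), and the covariate list are illustrative assumptions, and matching is simple 1:1 nearest-neighbor with replacement.

        import pandas as pd
        from sklearn.linear_model import LogisticRegression
        from sklearn.neighbors import NearestNeighbors
        from lifelines import CoxPHFitter

        def propensity_match(df, covariates):
            """1:1 nearest-neighbor matching on the propensity score."""
            ps = LogisticRegression(max_iter=1000).fit(df[covariates], df["treated"])
            df = df.assign(ps=ps.predict_proba(df[covariates])[:, 1])
            treated = df[df["treated"] == 1].reset_index(drop=True)
            control = df[df["treated"] == 0].reset_index(drop=True)
            nn = NearestNeighbors(n_neighbors=1).fit(control[["ps"]])
            _, idx = nn.kneighbors(treated[["ps"]])
            matched = control.iloc[idx.ravel()].reset_index(drop=True)
            treated["pair_id"] = matched["pair_id"] = list(range(len(treated)))
            return pd.concat([treated, matched], ignore_index=True)

        cohort = pd.read_csv("cohort.csv")                # hypothetical input file
        covariates = ["age", "sex", "comorbidity_score"]  # assumed covariates
        matched = propensity_match(cohort, covariates)

        # Conditional (matched-pair) Cox model: stratify on the pair identifier.
        cph = CoxPHFitter()
        cph.fit(matched[["days_to_event", "event", "treated", "pair_id"]],
                duration_col="days_to_event", event_col="event",
                strata=["pair_id"])
        cph.print_summary()  # hazard ratio for "treated" with 95% CI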

    Ground water and surface water under stress

    Presented at Ground water and surface water under stress: competition, interaction, solutions: a USCID water management conference held October 25-28, 2006, in Boise, Idaho. The A&B Irrigation District in south-central Idaho supplies water to irrigate over 76,000 acres. The district's 14,660-acre Unit A is supplied with water from the Snake River. Unit B comprises 62,140 acres of land irrigated by pumping groundwater from the Eastern Snake Plain Aquifer (ESPA) using 177 deep wells. Pumping depths range from 200 to 350 feet. Water from Unit B wells is distributed to irrigated lands via a system of short, unlined lateral canals averaging about 3/4 mile in length, with capacities of 2 to 12 cfs. From 1975 to 2005, the average level of the ESPA under the A&B Irrigation District dropped 25 ft, and as much as 40 ft in some locations. This has forced the district to deepen some existing wells and drill several new ones. To help mitigate the declining aquifer, the district and its farmers have implemented a variety of irrigation system and management improvements, involving a concerted effort by the district, landowners, and local and federal resource agencies. The district has installed variable-speed drives on some supply wells, installed a SCADA system to remotely monitor and control well pumps, and piped portions of the open distribution laterals, permitting farmers to connect farm pressure pumps directly to supply well outlets. Farmers have helped by converting many of their surface irrigation application systems to sprinklers, moving farm deliveries to central locations to reduce conveyance losses, and installing systems to reclaim irrigation spills and return flows.

    A Decade of SCADA Implementation

    Presented at Meeting irrigation demands in a water-challenged environment: SCADA and technology: tools to improve production: a USCID water management conference held September 28 - October 1, 2010, in Fort Collins, Colorado. Irrigated agriculture began in southwest Idaho's Lower Payette Valley in the 1880s. By 1900, over 30,000 irrigated acres had been developed, served by a system of over 20 canals diverting natural flows. High springtime river flows were often reduced to a trickle by August. Two Bureau of Reclamation dams were built to provide supplemental storage and to bring another 53,000 acres into production. Like many early canal systems, the Payette Valley canals were built with only a few manually operated water control structures or water measurement devices. Diversions were difficult to control due to variable river flows, and much water was wasted. Water rights were difficult to administer due to the lack of accurate water measurement. In dry years there were often disputes among users on different canals as natural flows declined. In 1997, the first canal headworks in the Payette were automated, using solar power and simple off-the-shelf components. The success of this single project encouraged more irrigation entities to improve water control capabilities using SCADA. New control structures were built and automated, and communication links were put in place to monitor canal operations and to update water accounting. Today, there are over 40 automated control gates, 14 telemetered water measurement sites, and 11 new water measurement structures. Diversion data are now recorded daily and accurately account for water use in the basin. Telemetry has enabled canal operators to monitor facilities and to respond quickly to changing water needs or emergency situations. Canal systems in the valley are being operated more efficiently, reducing both diversion rates and operational spills. This more efficient operation has helped to improve water supply reliability. These changes have also served to bring a greater sense of cooperation to water users throughout the Payette Valley.

    Internationales Product Management 2011 : Einsatz und Trends – Ergebnisse Schweiz

    In 2011, we are conducting for the first time a comprehensive study of the status quo and current trends in product management. Product management decision-makers from Switzerland and abroad are being surveyed.

    Factors associated with screening or treatment initiation among male United States veterans at risk for osteoporosis fracture

    Osteoporosis continues to be under-recognized and undertreated in men. An understanding of which factors cue clinicians about osteoporosis risk in men, and which do not, is needed to identify areas for improvement. This study sought to measure the association of a provider's recognition of osteoporosis with patient information constructs that are available at the time of each encounter. Using clinical and administrative data from the Veterans Health Administration system, we used a stepwise procedure to construct prognostic models for a combined outcome of osteoporosis diagnosis, treatment, or a bone mineral density (BMD) test order, using time-varying covariates and Cox regression. We ran separate models for patients with at least one primary care visit and for patients with only secondary care visits in the pre-index period. Some of the strongest predictors of clinical osteoporosis identification were a history of gonadotropin-releasing hormone (GnRH) agonist exposure, fragility fractures, and a diagnosis of rheumatoid arthritis. Other characteristics associated with a higher likelihood of having osteoporosis risk recognized were underweight or normal body mass index, cancer, fall history, and thyroid disease. Medication exposures associated with osteoporosis risk recognition included opioids, glucocorticoids, and antidepressants. Several known clinical risk factors for fracture, including smoking and alcohol abuse, were not associated with osteoporosis risk recognition. These results suggest that clinicians rely on some, but not all, clinical risk factors when assessing osteoporosis risk.
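
    For readers interested in the modeling step, a minimal sketch of a Cox regression with time-varying covariates follows, using the lifelines library. The long-format input file and its column names (patient_id, start_day, stop_day, recognized, and the exposure flags) are assumptions for illustration, not the study's actual data dictionary.

        import pandas as pd
        from lifelines import CoxTimeVaryingFitter

        # Long format: one row per patient per interval over which covariates
        # are constant; an exposure flag (e.g., gnrh_agonist) flips to 1 when
        # that exposure begins.
        long_df = pd.read_csv("patient_intervals.csv")  # hypothetical file

        ctv = CoxTimeVaryingFitter()
        ctv.fit(long_df,
                id_col="patient_id",
                start_col="start_day",
                stop_col="stop_day",
                event_col="recognized")  # dx, treatment, or BMD order
        ctv.print_summary()              # hazard ratios for each flag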

    Addressing the Fertility Needs of HIV-Seropositive Males

    An increasing number of serodiscordant couples are utilizing advanced reproductive technologies to address their reproductive needs. Recent literature has demonstrated that it is not only technically possible but also safe to utilize sperm-washing techniques to allow for the creation of embryos, thereby preventing both horizontal and vertical transmission of HIV. This article addresses the strengths and weaknesses of various reproductive techniques and discusses our experience at Columbia University (NY, USA), the location of the largest HIV-focused fertility program in the USA.

    Statistically Adaptive Filtering for Low Signal Correction in X-ray Computed Tomography

    Low x-ray dose is desirable in x-ray computed tomographic (CT) imaging due to health concerns, but low dose comes at the cost of low-signal artifacts such as streaks and low-frequency bias in the reconstruction. As a result, low-signal correction is needed to reduce artifacts while retaining relevant anatomical structures. Low signal is encountered when too few photons reach the detector to have confidence in the recorded data. X-ray photon counts, assumed to follow a Poisson distribution, have a signal-to-noise ratio (SNR) proportional to the dose, with poorer SNR in low-signal areas. Electronic noise added by the data acquisition system further reduces signal quality. In this paper we demonstrate a technique to combat low-signal artifacts through adaptive filtration. It entails statistics-based filtering of the uncorrected data, correcting the lower-signal areas more aggressively than the high-signal ones. We use local averages to decide how aggressive the filtering should be, and local standard deviation to decide how much detail preservation to apply. The implementation consists of a pre-correction step, i.e., a local linear minimum mean-squared-error (LLMMSE) correction, followed by a variance-stabilizing transform and, finally, adaptive bilateral filtering, with the coefficients of the bilateral filter computed from local statistics. Results show improvements in terms of low-frequency bias, streaks, local average and standard deviation, modulation transfer function, and noise power spectrum.
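
    A toy Python sketch of this pipeline is given below: local mean and standard-deviation maps drive an LLMMSE pre-correction, an Anscombe transform stabilizes the Poisson variance, and a bilateral filter whose range sigma shrinks with the local signal level smooths low-signal pixels more aggressively. All parameter choices and the exact weighting are illustrative assumptions, not the paper's tuned design.

        import numpy as np
        from scipy.ndimage import uniform_filter

        def local_stats(x, size=7):
            """Local mean and standard deviation via box filtering."""
            mean = uniform_filter(x, size)
            var = uniform_filter(x * x, size) - mean ** 2
            return mean, np.sqrt(np.clip(var, 0, None))

        def llmmse(x, noise_var, size=7):
            """Local linear MMSE pre-correction (Wiener-style shrinkage)."""
            mean, std = local_stats(x, size)
            signal_var = np.clip(std ** 2 - noise_var, 0, None)
            gain = signal_var / np.maximum(signal_var + noise_var, 1e-9)
            return mean + gain * (x - mean)

        def anscombe(x):
            """Variance-stabilizing transform for Poisson-like data."""
            return 2.0 * np.sqrt(np.clip(x, 0, None) + 3.0 / 8.0)

        def adaptive_bilateral(x, radius=3, sigma_s=2.0):
            """Bilateral filter whose range sigma follows local statistics."""
            mean, std = local_stats(x)
            # Smooth low-signal pixels harder: the range sigma shrinks as the
            # local mean rises; the local std sets the range-kernel scale.
            sigma_r = np.maximum(std, 1e-6) / (1.0 + mean / mean.mean())
            out = np.zeros_like(x)
            wsum = np.zeros_like(x)
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    shifted = np.roll(x, (dy, dx), axis=(0, 1))  # periodic edges
                    w = (np.exp(-(dy * dy + dx * dx) / (2.0 * sigma_s ** 2))
                         * np.exp(-((shifted - x) ** 2) / (2.0 * sigma_r ** 2)))
                    out += w * shifted
                    wsum += w
            return out / wsum

        counts = np.random.poisson(5.0, (64, 64)).astype(float)  # toy data
        filtered = adaptive_bilateral(anscombe(llmmse(counts, noise_var=5.0)))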

    MBIR Training for a 2.5D DL network in X-ray CT

    In computed tomographic imaging, model-based iterative reconstruction (MBIR) methods have generally shown better image quality than the more traditional, faster filtered backprojection (FBP) technique, but MBIR is computationally expensive. In this work we train a 2.5D deep learning (DL) network to mimic MBIR-quality images. The network is realized as a modified U-Net and trained using clinical FBP and MBIR image pairs. We achieve the quality of MBIR images faster and at a much smaller computational cost. Visually and in terms of the noise power spectrum (NPS), DL-MBIR images have texture similar to that of MBIR, with reduced noise power. Image profile plots, NPS plots, standard deviation, and other measures suggest that the DL-MBIR images result from a successful emulation of an MBIR operator.
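
    The sketch below illustrates the 2.5D idea in PyTorch: a small U-Net-style network takes a stack of adjacent FBP slices as input channels and regresses the central MBIR slice under an MSE loss. The depth, channel widths, slice count, and training details are assumptions; the paper's modified U-Net is not reproduced here.

        import torch
        import torch.nn as nn

        def conv_block(cin, cout):
            return nn.Sequential(
                nn.Conv2d(cin, cout, 3, padding=1), nn.ReLU(inplace=True),
                nn.Conv2d(cout, cout, 3, padding=1), nn.ReLU(inplace=True))

        class UNet2p5D(nn.Module):
            """2.5D U-Net: input channels are adjacent slices (e.g., 5)."""
            def __init__(self, slices=5):
                super().__init__()
                self.enc1 = conv_block(slices, 32)
                self.enc2 = conv_block(32, 64)
                self.pool = nn.MaxPool2d(2)
                self.up = nn.ConvTranspose2d(64, 32, 2, stride=2)
                self.dec1 = conv_block(64, 32)
                self.out = nn.Conv2d(32, 1, 1)  # predict the central slice

            def forward(self, x):
                e1 = self.enc1(x)
                e2 = self.enc2(self.pool(e1))
                d1 = self.dec1(torch.cat([self.up(e2), e1], dim=1))
                return self.out(d1)

        model = UNet2p5D()
        opt = torch.optim.Adam(model.parameters(), lr=1e-4)
        loss_fn = nn.MSELoss()

        # Toy stand-in for clinical FBP/MBIR pairs (batch, slices, H, W).
        fbp = torch.randn(2, 5, 128, 128)
        mbir = torch.randn(2, 1, 128, 128)
        opt.zero_grad()
        loss = loss_fn(model(fbp), mbir)  # match the central MBIR slice
        loss.backward()
        opt.step()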